14 research outputs found

    SENSITIVITY ANALYSIS OF COUPLED CRITICALITY CALCULATIONS

    ABSTRACT Perturbation theory-based sensitivity analysis is a vital part of today's nuclear reactor design. This paper presents an extension of standard techniques to examine coupled criticality problems with mutual feedback between neutronics and an augmenting system (for example, thermal-hydraulics). The proposed procedure uses a neutronic and an augmenting adjoint function to efficiently calculate the first-order change in responses of interest due to variations of the parameters describing the coupled problem. The effect of the perturbations is considered in two different ways in our study: either a change is allowed in the power level while maintaining criticality (power perturbation), or a change is allowed in the eigenvalue while the power is constrained (eigenvalue perturbation). The calculated response can be the change in the power level, the reactivity worth of the perturbation, or the change in any functional of the flux, the augmenting dependent variables, and the input parameters. To obtain power- and criticality-constrained sensitivities, power- and k-reset procedures can be applied, yielding identical results. Both the theoretical background and an application to a one-dimensional slab problem are presented, along with an iterative procedure to compute the necessary adjoint functions using the neutronics and the augmenting codes separately, thus eliminating the need to develop new programs to solve the coupled adjoint problem.
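    The first-order eigenvalue perturbation formula at the heart of such adjoint-based sensitivity analysis can be illustrated on a small matrix eigenproblem. A minimal numpy sketch with toy removal and fission operators (not the paper's coupled model): the adjoint flux is the fundamental mode of the transposed problem, and one adjoint-weighted inner product predicts the change in k without re-solving the perturbed system.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5
# Toy operators for M phi = (1/k) F phi: diagonal removal, positive fission
M = np.diag(rng.uniform(1.0, 2.0, n))
F = 0.5 * rng.random((n, n)) + np.eye(n)

def fundamental(M, F):
    # M phi = (1/k) F phi  <=>  (M^-1 F) phi = k phi; keep the largest k
    vals, vecs = np.linalg.eig(np.linalg.solve(M, F))
    i = np.argmax(vals.real)
    return vals[i].real, vecs[:, i].real

k, phi = fundamental(M, F)
# Adjoint flux: fundamental mode of the transposed (adjoint) problem
_, phi_adj = fundamental(M.T, F.T)

# Perturb the removal term of the first group
dM = np.zeros((n, n))
dM[0, 0] = 1e-3

# First-order perturbation theory with dF = 0:
# d(1/k) = <phi+, dM phi> / <phi+, F phi>,  so  dk = -k^2 * d(1/k)
dk_pred = -k**2 * (phi_adj @ dM @ phi) / (phi_adj @ F @ phi)

k_new, _ = fundamental(M + dM, F)
print(dk_pred, k_new - k)  # first-order estimate vs. direct recomputation
```

    The estimate agrees with a direct recomputation to second order in the perturbation, which is why a single adjoint solve suffices for arbitrarily many parameter sensitivities.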

    Sub-second photon dose prediction via transformer neural networks

    Fast dose calculation is critical for online and real-time adaptive therapy workflows. While modern physics-based dose algorithms must compromise accuracy to achieve low computation times, deep learning models can potentially perform dose prediction tasks with both high fidelity and speed. We present a deep learning algorithm that, exploiting synergies between Transformer and convolutional layers, accurately predicts broad photon beam dose distributions in a few milliseconds. The proposed improved Dose Transformer Algorithm (iDoTA) maps arbitrary patient geometries and beam information (in the form of a 3D projected shape resulting from a simple ray tracing calculation) to their corresponding 3D dose distribution. Treating the 3D CT input and dose output volumes as a sequence of 2D slices along the direction of the photon beam, iDoTA solves the dose prediction task as sequence modeling. The proposed model combines a Transformer backbone routing long-range information between all elements in the sequence with a series of 3D convolutions extracting local features of the data. We train iDoTA on a dataset of 1700 beam dose distributions, using 11 clinical volumetric modulated arc therapy (VMAT) plans (from prostate, lung, and head and neck cancer patients with 194-354 beams per plan) to assess its accuracy and speed. iDoTA predicts individual photon beams in ~50 milliseconds with a high gamma pass rate of 97.72% (2 mm, 2%). Furthermore, estimating full VMAT dose distributions in 6-12 seconds, iDoTA achieves state-of-the-art performance with a 99.51% (2 mm, 2%) pass rate. Offering the sub-second speed needed in online and real-time adaptive treatments, iDoTA represents a new state of the art in data-driven photon dose calculation. The proposed model can massively speed up current photon workflows, reducing calculation times from a few minutes to just a few seconds.
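    The gamma pass rate quoted above (2 mm, 2%) combines a dose-difference criterion with a distance-to-agreement (DTA) criterion. A minimal 1D sketch of the metric (brute-force global gamma on a synthetic Gaussian beam profile; clinical evaluations are 3D and use more refined search strategies):

```python
import numpy as np

def gamma_pass_rate(ref, evl, spacing, dta=2.0, dd=0.02):
    # Simplified 1D global gamma: for each reference point, search all
    # evaluated points for the minimum combined dose/distance discrepancy.
    x = np.arange(len(ref)) * spacing        # positions in mm
    dmax = ref.max()                          # global normalization dose
    gammas = np.empty(len(ref))
    for i in range(len(ref)):
        dist2 = ((x - x[i]) / dta) ** 2                # DTA term
        dose2 = ((evl - ref[i]) / (dd * dmax)) ** 2    # dose-difference term
        gammas[i] = np.sqrt(np.min(dist2 + dose2))
    return np.mean(gammas <= 1.0)            # fraction of passing points

ref = np.exp(-0.5 * ((np.arange(100) - 50) / 10.0) ** 2)  # Gaussian "profile"
shifted = np.roll(ref, 1)   # 1 mm shift at 1 mm spacing, within the 2 mm DTA
rate = gamma_pass_rate(ref, shifted, spacing=1.0)
print(rate)
```

    A 1 mm spatial shift passes everywhere under a 2 mm DTA, illustrating how gamma analysis tolerates small geometric deviations that a pure dose-difference test would flag.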

    A probabilistic deep learning model of inter-fraction anatomical variations in radiotherapy

    In radiotherapy, the internal movement of organs between treatment sessions causes errors in the final radiation dose delivery. Motion models can be used to simulate motion patterns and assess anatomical robustness before delivery. Traditionally, such models are based on principal component analysis (PCA) and are either patient-specific (requiring several scans per patient) or population-based, applying the same deformations to all patients. We present a hybrid approach which, based on population data, makes it possible to predict patient-specific inter-fraction variations for an individual patient. We propose a deep learning probabilistic framework that generates deformation vector fields (DVFs) warping a patient's planning computed tomography (CT) into possible patient-specific anatomies. This daily anatomy model (DAM) uses few random variables capturing groups of correlated movements. Given a new planning CT, DAM estimates the joint distribution over the variables, with each sample from the distribution corresponding to a different deformation. We train our model using a dataset of 312 CT pairs from 38 prostate cancer patients. For 2 additional patients (22 CTs), we compute the contour overlap between real and generated images, and compare the sampled and ground truth distributions of volume and center of mass changes. With a DICE score of 0.86 and a distance between prostate contours of 1.09 mm, DAM matches and improves upon PCA-based models. The distribution overlap further indicates that DAM's sampled movements match the range and frequency of clinically observed daily changes on repeat CTs. Conditioned only on a planning CT and contours of a new patient, without any pre-processing, DAM can accurately predict CTs seen during subsequent treatment sessions, which can be used for anatomically robust treatment planning and robustness evaluation against inter-fraction anatomical changes.
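    The PCA baseline that such models are compared against can be sketched in a few lines: stack the observed DVFs, extract the principal motion modes, and sample new deformations from Gaussian mode weights. Toy random data stands in for real flattened DVFs here; this illustrates the mechanism, not the paper's trained model.

```python
import numpy as np

rng = np.random.default_rng(1)
# Stand-in for flattened DVFs: n_obs repeat-CT deformations,
# each with n_vox * 3 displacement components
n_obs, n_comp = 40, 3 * 64
dvfs = rng.normal(size=(n_obs, n_comp)) @ rng.normal(size=(n_comp, n_comp)) * 0.1

mean = dvfs.mean(axis=0)
# Principal motion modes from the SVD of the centered data matrix
U, S, Vt = np.linalg.svd(dvfs - mean, full_matrices=False)
k = 5                                  # retain k dominant modes
sigma = S[:k] / np.sqrt(n_obs - 1)     # std dev of each mode coefficient

def sample_dvf(rng):
    # Draw mode weights from N(0, sigma^2) and reconstruct a plausible DVF
    w = rng.normal(scale=sigma)
    return mean + w @ Vt[:k]

new_dvf = sample_dvf(rng)
print(new_dvf.shape)
```

    Each sampled DVF can then warp the planning CT into a plausible daily anatomy; the deep learning model replaces the linear modes with a learned, patient-conditioned distribution.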

    Robustness analysis of CTV and OAR dose in clinical PBS-PT of neuro-oncological tumors: prescription-dose calibration and inter-patient variation with the Dutch proton robustness evaluation protocol

    Objective: The Dutch proton robustness evaluation protocol prescribes the dose of the clinical target volume (CTV) to the voxel-wise minimum (VWmin) dose of 28 scenarios. This results in a consistent but conservative near-minimum CTV dose (D98%,CTV). In this study, we analyzed (i) the correlation between VWmin/voxel-wise maximum (VWmax) metrics and the actually delivered dose to the CTV and organs at risk (OARs) under the impact of treatment errors, and (ii) the performance of the protocol before and after its calibration with adequate prescription-dose levels. Approach: Twenty-one neuro-oncological patients were included. Polynomial chaos expansion was applied to perform a probabilistic robustness evaluation using 100,000 complete fractionated treatments per patient. Patient-specific scenario distributions of clinically relevant dosimetric parameters for the CTV and OARs were determined and compared to clinical VWmin and VWmax dose metrics for different scenario subsets used in the robustness evaluation protocol. Main results: The inclusion of more geometrical scenarios leads to a significant increase in the conservatism of the protocol in terms of clinical VWmin and VWmax values for the CTV and OARs. The protocol could be calibrated using VWmin dose evaluation levels of 93.0%-92.3%, depending on the scenario subset selected. Despite this calibration of the protocol, robustness recipes for proton therapy showed remaining differences and an increased sensitivity to geometrical random errors compared to photon-based margin recipes. Significance: The Dutch proton robustness evaluation protocol, combined with the photon-based margin recipe, could be calibrated with a VWmin evaluation dose level of 92.5%. However, it shows limitations in predicting robustness in dose, especially for the near-maximum dose metrics to OARs. Consistent robustness recipes could improve proton treatment planning by calibrating residual differences from photon-based assumptions.
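    The voxel-wise minimum evaluation described above reduces to a per-voxel reduction over the scenario axis, after which a near-minimum dose such as D98% is read from the resulting distribution. A minimal sketch with synthetic dose scenarios (toy numbers, not patient data):

```python
import numpy as np

rng = np.random.default_rng(2)
# Toy dose scenarios: 28 error scenarios x 1000 CTV voxels (dose in Gy)
scenarios = rng.normal(loc=60.0, scale=1.5, size=(28, 1000))

vw_min = scenarios.min(axis=0)   # voxel-wise minimum over scenarios
vw_max = scenarios.max(axis=0)   # voxel-wise maximum over scenarios

def d98(dose):
    # Near-minimum dose D98%: dose received by at least 98% of the volume
    return np.percentile(dose, 2)

print(d98(vw_min), d98(scenarios[0]))
```

    Because the voxel-wise minimum lies at or below every individual scenario in every voxel, its D98% is never above any single scenario's D98%, which is exactly the conservatism the paper quantifies and calibrates.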

    Learning the Physics of Particle Transport via Transformers

    Particle physics simulations are the cornerstone of nuclear engineering applications. Among them, radiotherapy (RT) is crucial for society, with 50% of cancer patients receiving radiation treatments. For the most precise targeting of tumors, next-generation RT treatments aim for real-time correction during radiation delivery, necessitating particle transport algorithms that yield precise dose distributions in sub-second times even in highly heterogeneous patient geometries. This is infeasible with currently available, purely physics-based simulations. In this study, we present a data-driven dose calculation algorithm predicting the dose deposited by mono-energetic proton beams for arbitrary energies and patient geometries. Our approach frames particle transport as sequence modeling, where convolutional layers extract important spatial features into tokens and the transformer self-attention mechanism routes information between such tokens in the sequence and a beam energy token. We train our network and evaluate prediction accuracy using computationally expensive but accurate Monte Carlo (MC) simulations, considered the gold standard in particle physics. Our proposed model is 33 times faster than current clinical analytic pencil beam algorithms, improving upon their accuracy in the most heterogeneous and challenging geometries. With a relative error of 0.34±0.2% and a very high gamma pass rate of 99.59±0.7% (1%, 3 mm), it also greatly outperforms the only published similar data-driven proton dose algorithm, even at a finer grid resolution. Offering MC precision 4000 times faster, our model could overcome a major obstacle that has so far prohibited real-time adaptive proton treatments and significantly increase cancer treatment efficacy. Its potential to model physics interactions of other particles could also boost heavy ion treatment planning procedures limited by the speed of traditional methods.
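    The slice-as-token formulation can be illustrated with plain numpy: embed each 2D slice along the beam axis into a token vector, then mix the sequence with scaled dot-product self-attention. Random projections stand in for the learned convolutional encoder and trained attention weights; this sketches the mechanism, not the trained model.

```python
import numpy as np

rng = np.random.default_rng(4)
# Toy CT volume: 24 slices along the beam axis, each 16x16
volume = rng.normal(size=(24, 16, 16))

# "Tokenize": project each flattened 2D slice into a d-dim token
d = 32
W_embed = rng.normal(scale=0.1, size=(16 * 16, d))
tokens = volume.reshape(24, -1) @ W_embed          # (seq_len, d)

def self_attention(x, Wq, Wk, Wv):
    # Scaled dot-product self-attention routing information between slices
    q, k, v = x @ Wq, x @ Wk, x @ Wv
    scores = q @ k.T / np.sqrt(k.shape[1])
    w = np.exp(scores - scores.max(axis=1, keepdims=True))
    w /= w.sum(axis=1, keepdims=True)              # softmax over the sequence
    return w @ v

Wq, Wk, Wv = (rng.normal(scale=0.1, size=(d, d)) for _ in range(3))
mixed = self_attention(tokens, Wq, Wk, Wv)
print(mixed.shape)
```

    Each output token now carries information from every slice in the sequence, which is how long-range dependencies along the beam path (e.g. upstream heterogeneities affecting downstream dose) are captured.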

    Derivation of mean dose tolerances for new fractionation schemes and treatment modalities

    Avoiding toxicities in radiotherapy requires knowledge of tolerable organ doses. For new, experimental fractionation schemes (e.g. hypofractionation), these are typically derived from traditional schedules using the Biologically Effective Dose (BED) model. In this report we investigate the difficulties of establishing mean dose tolerances that arise since the mean BED depends on the entire spatial dose distribution, rather than on the dose level alone. Methods: A formula has been derived to establish mean physical dose constraints such that they are mean-BED equivalent to a reference treatment scheme. This formula constitutes a modified BED equation where the influence of the spatial dose distribution is summarized in a single parameter, the dose shape factor. To quantify effects, we analyzed 24 liver cancer patients for whom both proton and photon IMRT treatment plans were available. Results: The results show that the standard BED equation - neglecting the spatial dose distribution - can overestimate mean dose tolerances for hypofractionated treatments by up to 20%. The shape difference between photon and proton dose distributions can cause 30-40% differences in mean physical dose for plans having identical mean BEDs. Converting hypofractionated, 5/15-fraction proton doses to mean-BED equivalent photon doses in traditional 35-fraction regimens resulted in up to 10 Gy higher doses than applying the standard BED formula. Conclusions: The dose shape effect should be accounted for to avoid overestimation of mean dose tolerances, particularly when estimating constraints for hypofractionated regimens. Additionally, tolerances established for one treatment modality cannot necessarily be applied to other modalities with drastically different dose distributions, such as proton therapy. Lastly, protons may only allow marginal (5-10%) dose escalation if a fraction-size-adjusted organ mean dose is the constraint instead of a physical dose.
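    The core effect can be reproduced with the standard BED formula, BED = D(1 + d/(α/β)) with fraction dose d = D/n: because BED is convex in dose, the mean BED of a heterogeneous distribution exceeds the BED of the mean dose, and the gap is captured exactly by a shape factor of the form φ = E[D²]/E[D]². A numeric sketch with synthetic dose values and an illustrative α/β = 3 Gy (the paper's dose shape factor serves this same role; the toy numbers are not from the study):

```python
import numpy as np

def bed(dose, n_frac, ab=3.0):
    # Standard BED: BED = D * (1 + d / (alpha/beta)), fraction dose d = D / n_frac
    return dose * (1.0 + dose / (n_frac * ab))

rng = np.random.default_rng(3)
# Synthetic heterogeneous organ dose distribution (Gy), 10^4 voxels
dose = np.clip(rng.normal(20.0, 10.0, 10_000), 0.0, None)

n = 15
mean_bed = bed(dose, n).mean()      # mean BED of the full distribution
bed_of_mean = bed(dose.mean(), n)   # standard formula applied to the mean dose only

# A shape factor phi = E[D^2] / E[D]^2 recovers the exact mean BED:
phi = np.mean(dose**2) / np.mean(dose)**2
mean_bed_shape = dose.mean() * (1.0 + phi * dose.mean() / (n * 3.0))

print(mean_bed, bed_of_mean, mean_bed_shape)
```

    Applying the plain BED equation to the mean dose therefore understates the biological effect of a heterogeneous distribution, which is exactly why tolerances converted that way can come out too permissive.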

    UNCERTAINTY QUANTIFICATION IN STEADY STATE SIMULATIONS OF A MOLTEN SALT SYSTEM USING POLYNOMIAL CHAOS EXPANSION ANALYSIS

    Uncertainty Quantification (UQ) of numerical simulations is highly relevant in the study and design of complex systems. Among the various approaches available, Polynomial Chaos Expansion (PCE) analysis has recently attracted great interest. It belongs to the non-intrusive spectral projection methods and consists of constructing system responses as polynomial functions of the stochastic inputs. The limited number of required model evaluations and the possibility of applying it to codes without any modification make this technique extremely attractive. In this work, we propose the use of PCE to perform UQ of complex, multi-physics models for liquid-fueled reactors, addressing key design aspects of neutronics and thermal fluid dynamics. Our PCE approach uses Smolyak sparse grids designed to estimate the PCE coefficients. To test its potential, the PCE method was applied to a 2D problem representative of the Molten Salt Fast Reactor physics. An in-house multi-physics tool constitutes the reference model. The studied responses are the maximum temperature and the effective multiplication factor. Results, validated by comparison with the reference model on 10³ Monte Carlo-sampled points, prove the effectiveness of our PCE approach in assessing uncertainties of complex coupled models.
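    The non-intrusive projection idea can be shown in one stochastic dimension: evaluate the model at Gauss quadrature nodes, project onto probabilists' Hermite polynomials, and read the output mean and variance directly off the coefficients. A minimal sketch with a toy model exp(0.3x), x ~ N(0,1), whose statistics are known in closed form (the paper uses Smolyak sparse grids over several inputs; the principle is the same):

```python
import numpy as np
from numpy.polynomial import hermite_e as He
from math import factorial

def model(x):
    # Stand-in "expensive simulation" with one stochastic input x ~ N(0,1)
    return np.exp(0.3 * x)

deg = 6
# Non-intrusive projection: Gauss quadrature for the standard normal measure
nodes, weights = He.hermegauss(20)
weights = weights / weights.sum()   # normalize to probability weights

# PCE coefficients c_k = E[f(X) He_k(X)] / k!  (probabilists' Hermite basis)
coeffs = []
for k in range(deg + 1):
    basis = He.hermeval(nodes, [0] * k + [1])
    coeffs.append(np.sum(weights * model(nodes) * basis) / factorial(k))

# Output statistics follow directly from the coefficients
pce_mean = coeffs[0]
pce_var = sum(c**2 * factorial(k) for k, c in enumerate(coeffs[1:], 1))

print(pce_mean, pce_var)
```

    With only 20 model evaluations the truncated expansion reproduces the analytic mean e^0.045 and variance e^0.09(e^0.09 - 1) to high accuracy, illustrating why PCE needs far fewer runs than Monte Carlo sampling.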

    Preliminary uncertainty and sensitivity analysis of the Molten Salt Fast Reactor steady-state using a Polynomial Chaos Expansion method

    In this work, we present the results of a preliminary uncertainty quantification and sensitivity analysis study of the Molten Salt Fast Reactor (MSFR) behavior at steady state, performed by applying a non-intrusive Polynomial Chaos Expansion (PCE) approach. An in-house high-fidelity multi-physics simulation tool is used as the reactor reference model. Considering several thermal-hydraulics and neutronics parameters as stochastic inputs, with a limited number of samples we build a PCE meta-model able to reproduce the reactor response in terms of effective multiplication factor, maximum, minimum, and average salt temperatures, and the complete salt temperature distribution. The probability density functions of the responses are constructed and analyzed, highlighting strengths and issues of the current MSFR design. The sensitivity study highlights the relative importance of each input parameter, thus providing useful indications for future research efforts. The analysis of the whole temperature field shows that the heat exchanger can be a critical component, so its design requires particular care.

    PTV-based VMAT vs. robust IMPT for head-and-neck cancer: A probabilistic uncertainty analysis of clinical plan evaluation with the Dutch model-based selection

    Background and purpose: In the Netherlands, head-and-neck cancer (HNC) patients are referred for proton therapy (PT) through model-based selection (MBS). However, treatment errors may compromise adequate CTV dose. Our aims are: (i) to derive probabilistic plan evaluation metrics on the CTV consistent with clinical metrics; (ii) to evaluate plan consistency between photon (VMAT) and proton (IMPT) planning in terms of CTV dose iso-effectiveness; and (iii) to assess the robustness of the OAR doses and of the toxicity risks involved in the MBS. Materials and methods: Sixty HNC plans (30 IMPT/30 VMAT) were included. A robustness evaluation with 100,000 treatment scenarios per plan was performed using Polynomial Chaos Expansion (PCE). PCE was applied to determine scenario distributions of clinically relevant dosimetric parameters, which were compared between the two modalities. Finally, PCE-based probabilistic dose parameters were derived and compared to clinical PTV-based photon and voxel-wise proton evaluation metrics. Results: The probabilistic dose to the near-minimum volume v = 99.8% of the CTV correlated best with the clinical PTV-D98% and VWmin-D98%,CTV doses for VMAT and IMPT, respectively. IMPT showed slightly higher nominal CTV doses, with an average increase of 0.8 GyRBE in the median of the D99.8%,CTV distribution. Most patients qualified for IMPT through the dysphagia grade II model, for which an average NTCP gain of 10.5 percentage points was found. For all complications, uncertainties resulted in moderate NTCP spreads, below 3 percentage points on average for both modalities. Conclusion: Despite the differences between photon and proton planning, the comparison between PTV-based VMAT and robust IMPT is consistent. Treatment errors had a moderate impact on NTCPs, showing that the nominal plans are a good estimator for qualifying patients for PT.
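    The NTCP gain driving model-based selection is the difference between complication probabilities predicted for the photon and proton plans. A minimal sketch with a generic logistic NTCP curve; the D50, γ50, and dose values below are hypothetical illustration numbers, not the coefficients of the Dutch models.

```python
import numpy as np

def ntcp_logistic(dose_metric, d50, gamma50):
    # Generic logistic NTCP model: complication probability as a function of
    # a dose summary metric, with D50 the 50%-risk dose and gamma50 the
    # normalized slope at D50
    return 1.0 / (1.0 + np.exp(4.0 * gamma50 * (1.0 - dose_metric / d50)))

# Hypothetical mean doses to a swallowing structure (Gy), photon vs. proton plan
ntcp_vmat = ntcp_logistic(45.0, d50=50.0, gamma50=1.5)
ntcp_impt = ntcp_logistic(35.0, d50=50.0, gamma50=1.5)
delta = ntcp_vmat - ntcp_impt   # NTCP gain used in model-based selection
print(ntcp_vmat, ntcp_impt, delta)
```

    A patient qualifies for proton therapy when this gain exceeds a model-specific threshold; the probabilistic analysis above asks how robust that gain is under treatment errors.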